Variational Bayesian methods

Variational Bayesian methods are a family of techniques for approximating intractable integrals arising in Bayesian inference and machine learning. They are typically used in complex statistical models consisting of observed variables (usually termed "data") as well as unknown parameters and latent variables, with various sorts of relationships among the three types of random variables, as might be described by a graphical model. As is typical in Bayesian inference, the parameters and latent variables are grouped together as "unobserved variables". Variational Bayesian methods are primarily used for two purposes:
#To provide an analytical approximation to the posterior probability of the unobserved variables, in order to do statistical inference over these variables.
#To derive a lower bound for the marginal likelihood (sometimes called the "evidence") of the observed data (i.e. the marginal probability of the data given the model, with marginalization performed over unobserved variables). This is typically used for performing model selection, the general idea being that a higher marginal likelihood for a given model indicates a better fit of the data by that model and hence a greater probability that the model in question was the one that generated the data. (See also the Bayes factor article.)
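To make the second of these purposes concrete, the following minimal sketch (toy numbers throughout; the coin-flip setup and Beta priors are illustrative assumptions, not taken from this article) computes the marginal likelihood of a small dataset under two candidate models by numerically marginalizing out the unknown parameter, and compares the models via the Bayes factor:

 import numpy as np
 from math import gamma
 
 # Hypothetical toy setup (not from the article): x is a run of coin flips,
 # and each candidate "model" is a Beta prior over the heads probability
 # theta. The marginal likelihood P(x | model), the integral of
 # P(x | theta) P(theta | model) over theta, is approximated on a grid;
 # a higher value indicates a model that fits the observed data better.
 
 x = np.array([1, 1, 1, 1, 0, 1, 1, 1, 1, 1])   # 9 heads, 1 tail
 theta = np.linspace(0.001, 0.999, 999)          # grid over the parameter
 
 def beta_pdf(t, a, b):
     return t**(a - 1) * (1 - t)**(b - 1) * gamma(a + b) / (gamma(a) * gamma(b))
 
 def marginal_likelihood(a, b):
     lik = theta**x.sum() * (1 - theta)**(len(x) - x.sum())   # P(x | theta)
     return np.sum(lik * beta_pdf(theta, a, b)) * (theta[1] - theta[0])
 
 m_fair = marginal_likelihood(20, 20)   # model A: prior concentrated near 0.5
 m_flat = marginal_likelihood(1, 1)     # model B: uniform prior on [0, 1]
 
 print(f"P(x | A) = {m_fair:.5f},  P(x | B) = {m_flat:.5f}")
 print(f"Bayes factor, B over A = {m_flat / m_fair:.1f}")   # > 1 favours B here

Here the data are heavily skewed toward heads, so the uniform prior (which places mass on large values of theta) yields the higher marginal likelihood.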
In the former purpose (that of approximating a posterior probability), variational Bayes is an alternative to Monte Carlo sampling methods (particularly Markov chain Monte Carlo methods such as Gibbs sampling) for taking a fully Bayesian approach to statistical inference over complex distributions that are difficult to evaluate or sample from directly. In particular, whereas Monte Carlo techniques provide a numerical approximation to the exact posterior using a set of samples, variational Bayes provides a locally optimal, exact analytical solution to an approximation of the posterior.
Variational Bayes can be seen as an extension of the EM (expectation-maximization) algorithm from maximum a posteriori estimation (MAP estimation) of the single most probable value of each parameter to fully Bayesian estimation which computes (an approximation to) the entire posterior distribution of the parameters and latent variables. As in EM, it finds a set of optimal parameter values, and it has the same alternating structure as does EM, based on a set of interlocked (mutually dependent) equations that cannot be solved analytically.
For many applications, variational Bayes produces solutions of comparable accuracy to Gibbs sampling at greater speed. However, deriving the set of equations used to iteratively update the parameters often requires a large amount of work compared with deriving the comparable Gibbs sampling equations. This is the case even for many models that are conceptually quite simple, as is demonstrated below in the case of a basic non-hierarchical model with only two parameters and no latent variables.
==Mathematical derivation of the mean-field approximation==
In variational inference, the posterior distribution over a set of unobserved variables \mathbf{Z} = \{Z_1, \dots, Z_n\} given some data \mathbf{X} is approximated by a variational distribution, Q(\mathbf{Z}):
:P(\mathbf{Z}\mid \mathbf{X}) \approx Q(\mathbf{Z}).
The distribution Q(\mathbf{Z}) is restricted to belong to a family of distributions of simpler form than P(\mathbf{Z}\mid \mathbf{X}), selected with the intention of making Q(\mathbf{Z}) similar to the true posterior, P(\mathbf{Z}\mid \mathbf{X}). The lack of similarity is measured in terms of a dissimilarity function d(Q; P), and hence inference is performed by selecting the distribution Q(\mathbf{Z}) that minimizes d(Q; P).
The most common type of variational Bayes, known as ''mean-field variational Bayes'', uses the Kullback–Leibler divergence (KL-divergence) of ''P'' from ''Q'' as the choice of dissimilarity function, a choice that makes the minimization tractable. The KL-divergence is defined as
:D_{\mathrm{KL}}(Q \parallel P) = \sum_{\mathbf{Z}} Q(\mathbf{Z}) \log \frac{Q(\mathbf{Z})}{P(\mathbf{Z}\mid \mathbf{X})}.
Note that ''Q'' and ''P'' are reversed from what one might expect. This use of reversed KL-divergence is conceptually similar to the expectation-maximization algorithm. (Using the KL-divergence in the other way produces the expectation propagation algorithm.)
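To make the direction of the divergence concrete, here is a minimal sketch (with illustrative numbers that are not from this article) that grid-searches a factorized family Q(Z_1)\,Q(Z_2), of the mean-field kind discussed above, against a correlated two-variable posterior under both divergence directions:

 import itertools
 import numpy as np
 
 # Toy posterior P(Z1, Z2 | X) over two binary latent variables, with its
 # mass on the two "agreeing" states, so Z1 and Z2 are strongly correlated.
 # (Illustrative numbers only.) The factorized family Q(Z1) Q(Z2), with
 # q1 = Q(Z1 = 1) and q2 = Q(Z2 = 1), cannot represent that correlation.
 P = np.array([[0.48, 0.02],
               [0.02, 0.48]])   # P[z1, z2]
 
 def kl(a, b):
     """D_KL(a || b) for strictly positive probability tables."""
     return float(np.sum(a * np.log(a / b)))
 
 def q_joint(q1, q2):
     return np.outer([1.0 - q1, q1], [1.0 - q2, q2])
 
 grid = np.linspace(0.01, 0.99, 99)
 pairs = list(itertools.product(grid, grid))
 
 # Reverse KL, D_KL(Q || P): the mean-field variational Bayes objective.
 _, q1_rev, q2_rev = min((kl(q_joint(q1, q2), P), q1, q2) for q1, q2 in pairs)
 # Forward KL, D_KL(P || Q): the direction used by expectation propagation.
 _, q1_fwd, q2_fwd = min((kl(P, q_joint(q1, q2)), q1, q2) for q1, q2 in pairs)
 
 print(f"reverse-KL optimum: q1 = {q1_rev:.2f}, q2 = {q2_rev:.2f}")  # hugs one mode
 print(f"forward-KL optimum: q1 = {q1_fwd:.2f}, q2 = {q2_fwd:.2f}")  # matches marginals

The reverse direction concentrates Q on a single mode of the posterior (here the Z_1 = Z_2 = 0 state), which is why mean-field variational Bayes tends to understate posterior spread, while the forward direction recovers the exact marginals (0.5, 0.5).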
The KL-divergence can be written as
:D_{\mathrm{KL}}(Q \parallel P) = \sum_{\mathbf{Z}} Q(\mathbf{Z}) \log \frac{Q(\mathbf{Z})}{P(\mathbf{Z}, \mathbf{X})} + \log P(\mathbf{X}),
or
:\log P(\mathbf{X}) = D_{\mathrm{KL}}(Q \parallel P) - \sum_{\mathbf{Z}} Q(\mathbf{Z}) \log \frac{Q(\mathbf{Z})}{P(\mathbf{Z}, \mathbf{X})} = D_{\mathrm{KL}}(Q \parallel P) + \mathcal{L}(Q).

As the ''log evidence'' \log P(\mathbf{X}) is fixed with respect to Q, maximizing the final term \mathcal{L}(Q) minimizes the KL-divergence of P from Q. By appropriate choice of Q, \mathcal{L}(Q) becomes tractable to compute and to maximize. Hence we have both an analytical approximation Q for the posterior P(\mathbf{Z}\mid \mathbf{X}) and a lower bound \mathcal{L}(Q) for the log evidence \log P(\mathbf{X}). The lower bound \mathcal{L}(Q) is known as the (negative) ''variational free energy'' because it can also be expressed as an "energy" \operatorname{E}_{Q}[\log P(\mathbf{Z},\mathbf{X})] plus the entropy of Q.
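The decomposition above is easy to check numerically. In the following minimal sketch (illustrative numbers only), there is a single discrete latent variable and the data are already fixed, so the joint P(\mathbf{Z},\mathbf{X}) reduces to a vector over the states of Z; for an arbitrary Q it confirms both that \log P(\mathbf{X}) = D_{\mathrm{KL}}(Q \parallel P) + \mathcal{L}(Q) and that \mathcal{L}(Q) equals the "energy" plus the entropy of Q:

 import numpy as np
 
 # Numerical check of log P(X) = D_KL(Q || P) + L(Q) on a toy model
 # (illustrative numbers, not from the text). Z has three states and X is
 # fixed, so the joint P(Z, X) is just an unnormalized vector over Z.
 
 p_joint = np.array([0.10, 0.06, 0.04])   # P(Z = k, X) for k = 0, 1, 2
 p_x = p_joint.sum()                       # evidence P(X) = 0.20
 p_post = p_joint / p_x                    # exact posterior P(Z | X)
 
 q = np.array([0.60, 0.25, 0.15])          # an arbitrary variational Q(Z)
 
 kl = np.sum(q * np.log(q / p_post))       # D_KL(Q || P(Z | X))
 elbo = np.sum(q * np.log(p_joint / q))    # L(Q)
 
 print(f"log P(X)         = {np.log(p_x):.6f}")
 print(f"D_KL + L(Q)      = {kl + elbo:.6f}")         # equal, as derived above
 
 energy = np.sum(q * np.log(p_joint))      # E_Q[log P(Z, X)]
 entropy = -np.sum(q * np.log(q))          # H(Q)
 print(f"L(Q)             = {elbo:.6f}")
 print(f"energy + entropy = {energy + entropy:.6f}")  # equal: L(Q) = energy + H(Q)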

Excerpt source: Wikipedia, the free encyclopedia. Read the full "Variational Bayesian methods" article at Wikipedia.


